Learning in high-dimensional feature spaces using ANOVA-based fast matrix-vector multiplication

Authors

Abstract

<p style='text-indent:20px;'>Kernel matrices are crucial in many learning tasks such as support vector machines or kernel ridge regression. The kernel matrix is typically dense and large-scale. Depending on the dimension of the feature space, even the computation of all of its entries in reasonable time becomes a challenging task. For such dense matrices the cost of a matrix-vector product scales quadratically with the dimensionality <inline-formula><tex-math id="M1">\begin{document}$ N $\end{document}</tex-math></inline-formula>, if no customized methods are applied. We propose the use of an ANOVA kernel, where we construct several kernels based on lower-dimensional feature spaces for which we provide fast algorithms realizing the matrix-vector products. We employ the non-equispaced fast Fourier transform (NFFT), which is of linear complexity for fixed accuracy. Based on a feature grouping approach, we then show how the fast matrix-vector products can be embedded into a learning method, choosing kernel ridge regression and the conjugate gradient solver. We illustrate the performance of our approach on several data sets.</p>
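The pipeline described in the abstract — a regularized kernel system solved with conjugate gradients, where the solver only ever needs matrix-vector products with the kernel matrix — can be sketched in plain NumPy/SciPy. This is a minimal matrix-free illustration, with a dense Gaussian kernel standing in for the paper's NFFT-based ANOVA-kernel matvec; all names, data, and parameters are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
N, d = 200, 5
X = rng.standard_normal((N, d))                       # N data points in d dimensions
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)    # noisy regression targets

sigma, lam = 1.0, 1e-2                                # kernel width, ridge parameter

# Dense Gaussian kernel matrix; this is exactly the O(N^2) product the paper
# replaces with an NFFT-based ANOVA-kernel routine of linear complexity.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / (2 * sigma**2))

# Matrix-free operator for (K + lam * I); CG only calls its matvec, so any
# fast kernel multiplication can be dropped in here without other changes.
A = LinearOperator((N, N), matvec=lambda v: K @ v + lam * v)

alpha, info = cg(A, y)                                # solve (K + lam*I) alpha = y
y_fit = K @ alpha                                     # in-sample predictions
```

Because CG touches the matrix only through `matvec`, swapping the dense product for a fast one changes a single line, which is the structural point of the grouping approach.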


Similar Resources

Fast QMC Matrix-Vector Multiplication

Quasi-Monte Carlo (QMC) rules (1/N) ∑_{n=0}^{N−1} f(y_n A) can be used to approximate integrals of the form ∫_{[0,1]^s} f(yA) dy, where A is a matrix and y is a row vector. This type of integral arises, for example, from the simulation of a normal distribution with a general covariance matrix, from the approximation of the expectation value of solutions of PDEs with random coefficients, or from applications fr...
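The fast multiplication with a structured A is the contribution of that paper and is not reproduced here; as a hedged sketch of the plain estimator it accelerates, the normal-distribution example can be written with SciPy's scrambled Sobol sequence (all concrete numbers below are illustrative):

```python
import numpy as np
from scipy.stats import norm, qmc

# Target: E[||Z||^2] for Z ~ N(0, Sigma), whose exact value is trace(Sigma).
Sigma = np.array([[2.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.5]])
L = np.linalg.cholesky(Sigma)             # Sigma = L @ L.T

sampler = qmc.Sobol(d=3, scramble=True, seed=7)
Y = sampler.random_base2(m=12)            # N = 2^12 QMC points y_n in (0,1)^3

# Map uniform QMC points to N(0, Sigma) samples via the inverse normal CDF
# and the Cholesky factor; each row of Z plays the role of y_n A.
Z = norm.ppf(Y) @ L.T

estimate = np.mean(np.sum(Z**2, axis=1))  # (1/N) sum_n f(y_n A), f = squared norm
exact = np.trace(Sigma)                   # = 4.5
```

The QMC rule replaces i.i.d. samples with a low-discrepancy point set, which typically gives a much faster error decay than plain Monte Carlo for smooth integrands.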


Graphical Model-Based Learning in High Dimensional Feature Spaces

Digital media tend to combine text and images to express richer information, especially on image hosting and online shopping websites. This trend presents a challenge in understanding the contents from different forms of information. Features representing visual information are usually sparse in high dimensional space, which makes the learning process intractable. In order to understand text an...


Fast construction of hierarchical matrix representation from matrix-vector multiplication

We develop a hierarchical matrix construction algorithm using matrix–vector multiplications, based on the randomized singular value decomposition of low-rank matrices. The algorithm uses O(log n) applications of the matrix on structured random test vectors and O(n log n) extra computational cost, where n is the dimension of the unknown matrix. Numerical examples on constructing Green's functions ...
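The hierarchical bookkeeping is beyond a short sketch, but the core ingredient — recovering a low-rank matrix from a handful of black-box matrix-vector products via the randomized range finder — can be illustrated as follows. This is a generic sketch of the randomized-SVD idea, not that paper's algorithm; the rank, sizes, and oversampling count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 300, 5
B = rng.standard_normal((n, r))
M = B @ B.T                              # hidden rank-r symmetric matrix

matvec = lambda V: M @ V                 # only black-box access to M is assumed

# Probe the range with r + 10 random test vectors (oversampling for stability),
# then orthonormalize the images to get a basis for range(M).
Omega = rng.standard_normal((n, r + 10))
Q, _ = np.linalg.qr(matvec(Omega))

# For symmetric M, Q @ (M @ Q).T = Q Q^T M, so M is reconstructed from matvecs
# alone; with exact rank r < number of probes the recovery is essentially exact.
M_approx = Q @ matvec(Q).T
rel_err = np.linalg.norm(M - M_approx) / np.linalg.norm(M)
```

Only r + 10 matvecs are spent here; hierarchical constructions apply the same idea blockwise with structured test vectors to keep the total matvec count logarithmic.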


High Dimensional Data Clustering Using Fast Cluster Based Feature Selection

Feature selection involves identifying a subset of the most useful features that produces compatible results as the original entire set of features. A feature selection algorithm may be evaluated from both the efficiency and effectiveness points of view. While the efficiency concerns the time required to find a subset of features, the effectiveness is related to the quality of the subset of fea...


Learning from Limited Demonstrations in High Dimensional Feature Spaces

Reinforcement learning (RL) has recently gained a lot of popularity partially due to the success of deep Q-learning (DQN) on the Atari suite and AlphaGo. In these online domains DQN-RL performs favorably thanks to its ability to continuously learn at super human speeds. Unfortunately, in many real world applications, such as in robotics, the learning rate is limited due to the speed at which th...



Journal

Journal title: Foundations of Data Science

Year: 2022

ISSN: 2639-8001

DOI: https://doi.org/10.3934/fods.2022012